Polynomial Bounds for the VC-Dimension of Sigmoidal, Radial Basis Function, and Sigma-pi Networks

Author

  • Akito SAKURAI
Abstract

O(W²h²) is an asymptotic upper bound for the VC-dimension of a large class of neural networks, including sigmoidal, radial basis function, and sigma-pi networks, where h is the number of hidden units and W is the number of adjustable parameters; this extends Karpinski and Macintyre's recent results. The class is characterized by polynomial input functions and by activation functions that are solutions of first-order differential equations with rational function coefficients and that can be represented, in implicit function form, as a composition of the natural logarithm and polynomials. Ω(W log h) is a lower bound for the VC-dimension of sigmoidal, radial basis function, and sigma-pi networks.
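In symbols (the notation VCdim(𝒩) for the VC-dimension of the network class 𝒩 is mine, not the paper's; h and W are as defined above), the two results read:

\[
\mathrm{VCdim}(\mathcal{N}) = O(W^2 h^2)
\qquad\text{and}\qquad
\mathrm{VCdim}(\mathcal{N}) = \Omega(W \log h).
\]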


Similar articles

VC Dimension of Sigmoidal and General Pfaffian Networks

We introduce a new method for proving explicit upper bounds on the VC dimension of general functional basis networks, and prove as an application, for the first time, that the VC dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^(-y)) is bounded by a quadratic polynomial O((lm)²) in both the number l of programmable parameters and the number m of nodes. The pro...

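The standard sigmoid named in this abstract is also the simplest instance of the differential-equation characterization in the main paper above: it satisfies a first-order equation with polynomial (hence rational) coefficients, a calculus fact worth noting here:

\[
\sigma(y) = \frac{1}{1+e^{-y}}
\quad\Longrightarrow\quad
\sigma'(y) = \sigma(y)\bigl(1-\sigma(y)\bigr).
\]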

Polynomial Bounds for VC Dimension of Sigmoidal and General Pfaffian Neural Networks

We introduce a new method for proving explicit upper bounds on the VC dimension of general functional basis networks, and prove as an application, for the first time, that the VC dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^(-y)) is bounded by a quadratic polynomial O((lm)²) in both the number l of programmable parameters and the number m of nodes. The pro...


On the Complexity of Computing and Learning with Multiplicative Neural Networks

In a great variety of neuron models, neural inputs are combined using the summing operation. We introduce the concept of multiplicative neural networks that contain units that multiply their inputs instead of summing them and thus allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks ...

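To make the summing-versus-multiplying distinction concrete, here is a minimal sketch (my own illustration, not code from the paper) of a standard summing unit next to a product unit, one widely used multiplicative unit type in which inputs are raised to trainable exponents and multiplied:

    import numpy as np

    def summing_unit(x, w, b):
        # Conventional neuron: weighted sum of the inputs plus a bias.
        return float(np.dot(w, x) + b)

    def product_unit(x, w):
        # Multiplicative neuron: inputs raised to trainable exponents,
        # then multiplied, letting inputs interact nonlinearly.
        # Assumes positive inputs so fractional exponents are defined.
        return float(np.prod(x ** w))

    x = np.array([2.0, 3.0])
    w = np.array([0.5, 1.0])
    print(summing_unit(x, w, b=0.1))  # 0.5*2 + 1.0*3 + 0.1 = 4.1
    print(product_unit(x, w))         # 2**0.5 * 3**1.0 ≈ 4.243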

Radial Basis Function Neural Networks Have Superlinear VC Dimension

We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension Ω(W log k), where W is the number of parameters and k the number of nodes. This significantly improves the previousl...

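For concreteness, a minimal sketch (mine, assuming the standard Gaussian form of an RBF neuron; the paper's exact conventions may differ) of a local receptive field unit and the parameter count W that enters the Ω(W log k) bound:

    import numpy as np

    def gaussian_rbf_unit(x, center, width):
        # Local receptive field: the response decays with squared
        # distance from the unit's center, so it is large only nearby.
        return np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

    # One-hidden-layer RBF network on d-dimensional inputs with k units:
    # k*d center coordinates + k widths + k output weights + 1 bias.
    d, k = 4, 10
    W = k * d + k + k + 1
    print(W)  # 61 parameters; the VC dimension grows as Omega(W log k)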

Automatic Capacity Tuning of Very Large VC-Dimension Classifiers

Large VC-dimension classifiers can learn difficult tasks, but are usually impractical because they generalize well only if they are trained with huge quantities of data. In this paper we show that even very high-order polynomial classifiers can be trained with a small amount of training data and yet generalize better than classifiers with a smaller VC-dimension. This is achieved with a maximum margin al...

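The sentence cut off above refers to a maximum margin algorithm. As a sketch of the idea only (mine; scikit-learn's SVC trains a maximum-margin classifier and long postdates this paper), a high-order polynomial classifier fit from just four points:

    import numpy as np
    from sklearn.svm import SVC

    # XOR-style data: not linearly separable, so a higher-order
    # polynomial decision surface is required.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0, 1, 1, 0])

    # Maximum-margin training with a degree-3 polynomial kernel: the
    # margin, rather than the large VC dimension of degree-3 polynomials,
    # governs generalization, which is the paper's point.
    clf = SVC(kernel="poly", degree=3, coef0=1.0, C=1e6)
    clf.fit(X, y)
    print(clf.predict(X))  # expected: [0 1 1 0]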


Publication date: 1995